The idea of "irreversibility" is central to the understanding of entropy. Everyone has an intuitive understanding of irreversibility (a dissipative process): if one watches a movie of everyday life running forward and in reverse, it is easy to distinguish between the two. The movie running in reverse shows impossible things happening – water jumping out of a glass into a pitcher above it, smoke going down a chimney, water in a glass freezing to form ice cubes, crashed cars reassembling themselves, and so on. The intuitive meaning of expressions such as "you can't unscramble an egg", "don't cry over spilled milk" or "you can't take the cream out of the coffee" is that these are irreversible processes. These processes define a direction in time: spilled milk does not go back into the glass.

In thermodynamics, one says that the "forward" processes – pouring water from a pitcher, smoke going up a chimney, etc. – are "irreversible": they cannot happen in reverse, even though, on a microscopic level, no laws of physics would be violated if they did. All real physical processes involving systems in everyday life, with many atoms or molecules, are irreversible. For an irreversible process in an isolated system, the thermodynamic state variable known as entropy always increases. The reason that the movie in reverse is so easily recognized is that it shows processes for which entropy is decreasing, which is physically impossible.

In everyday life, there may be processes in which the increase of entropy is practically unobservable, almost zero. In these cases, a movie of the process run in reverse will not seem unlikely. For example, in a 1-second video of the collision of two billiard balls, it will be hard to distinguish the forward and the backward case, because the increase of entropy during that time is relatively small. In thermodynamics, one says that this process is practically "reversible", with an entropy increase that is practically zero.
The fact that the entropy of the Universe never decreases is expressed in the second law of thermodynamics. In a physical system, entropy provides a measure of the amount of thermal energy that ''cannot'' be used to do work. In some other definitions, entropy is a measure of how evenly energy (or some analogous property) is distributed in a system. ''Work'' and ''heat'' are determined by the process a system undergoes, and occur only at the boundary of the system. ''Entropy'' is a function of the state of a system, with a value determined by the system's state variables. The concept of entropy is central to the second law of thermodynamics, which determines which physical processes can occur. For example, it predicts that the flow of heat from a region of high temperature to a region of low temperature is a spontaneous process – it can proceed by itself without any extra external energy. When this process occurs, the hot region becomes cooler and the cold region becomes warmer: heat is distributed more evenly throughout the system, and the system's ability to do work has decreased because the temperature difference between the hot region and the cold region has decreased. Referring back to our definition of entropy, we can see that the entropy of this system has increased. Thus, the second law of thermodynamics states that the entropy of an isolated system never decreases, and processes that increase entropy can occur spontaneously. The entropy of a system increases as the range of momentum and/or position available to its components increases. The term ''entropy'' was coined in 1865 by the German physicist Rudolf Clausius, from the Greek words ''en-'', "in", and ''trope'', "a turning", in analogy with ''energy''.

==Explanation==
The concept of thermodynamic entropy arises from the second law of thermodynamics.
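The spontaneous heat flow described above can be made quantitative with a short numerical sketch (the function name and the 400 K / 300 K reservoir values are illustrative choices, not from the article): when heat Q leaves a hot reservoir its entropy falls by Q/T_hot, the cold reservoir's entropy rises by Q/T_cold, and because T_cold < T_hot the total change is positive.

```python
def entropy_change_heat_flow(q_joules, t_hot_k, t_cold_k):
    """Total entropy change (J/K) when heat q flows from a hot reservoir
    at t_hot_k to a cold reservoir at t_cold_k, each large enough that
    its temperature stays effectively constant."""
    ds_hot = -q_joules / t_hot_k    # hot region loses heat: its entropy decreases
    ds_cold = q_joules / t_cold_k   # cold region gains heat: its entropy increases
    return ds_hot + ds_cold         # net change, positive whenever t_cold_k < t_hot_k

# 1000 J flowing from a 400 K region to a 300 K region:
delta_s = entropy_change_heat_flow(1000.0, 400.0, 300.0)
# -1000/400 + 1000/300 = -2.5 + 3.33... ≈ +0.83 J/K, positive as the second law requires
```

Running the flow in reverse (heat moving from cold to hot) simply flips the sign, giving a negative total entropy change – exactly the kind of process the second law forbids in an isolated system.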
Through this law of entropy increase, the second law quantifies the reduction in a system's capacity for change (for example, heat always flows from a region of higher temperature to one of lower temperature until the temperature becomes uniform) and determines whether a thermodynamic process may occur.

Entropy is calculated in two ways. The first is the entropy change (ΔS) of a system containing a sub-system that undergoes heat transfer to its surroundings (inside the system of interest). It is based on the macroscopic relationship between the heat flow into the sub-system and the temperature at which it occurs, summed over the boundary of that sub-system. The second calculates the absolute entropy (S) of a system from the microscopic behaviour of its individual particles. It is based on the natural logarithm of the number of microstates possible in a particular macrostate (W or Ω), called the thermodynamic probability – roughly, the probability of the system being in that state. In this sense it effectively defines entropy independently of its effects due to changes involving heat, mechanical, electrical, chemical and other forms of energy, and it also encompasses logical states such as information. Following the formalism of Clausius, the first calculation can be mathematically stated as:〔I. Klotz, R. Rosenberg, ''Chemical Thermodynamics - Basic Concepts and Methods'', 7th ed., Wiley (2008), p. 125〕

:ΔS = ∫(δQ/T)

where the integral is taken over a reversible path and T is the absolute temperature at which the heat δQ is transferred.
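The second, microscopic calculation is Boltzmann's formula S = k_B ln W. A minimal sketch of both calculations (the function names and the example numbers are illustrative assumptions, not from the article): the Clausius integral reduces to Q/T when heat is transferred reversibly at constant temperature, and the Boltzmann formula turns a count of microstates into an absolute entropy.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact value in the 2019 SI)

def clausius_entropy_change(q_joules, t_kelvin):
    """ΔS = Q/T for heat transferred reversibly at constant temperature T.
    (The special case of ΔS = ∫δQ/T with T constant along the path.)"""
    return q_joules / t_kelvin

def boltzmann_entropy(n_microstates):
    """Absolute entropy S = k_B ln W for W equally probable microstates."""
    return K_B * math.log(n_microstates)

# Melting ice reversibly at 273.15 K: ΔS = Q/T
ds_melt = clausius_entropy_change(334000.0, 273.15)  # ≈ 1223 J/K per kg of ice

# A macrostate realized by a single microstate has zero entropy:
s_ordered = boltzmann_entropy(1)  # ln(1) = 0, so S = 0
```

Note that the Clausius expression yields only entropy ''changes'', while the Boltzmann expression yields an absolute value; the two agree on differences for thermodynamic systems.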